# Attention mechanism optimization
## Eris Lelantacles 7b

Eris-Lelantacles-7b is a large language model obtained by merging two 7B-parameter models, Eris-Beach_Day-7b and Lelanta-lake-7b, using the SLERP (spherical linear interpolation) merge method.

Tags: Other, Large Language Model, Transformers
Author: ChaoticNeutrals
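For context, SLERP merging interpolates each pair of corresponding weight tensors along an arc rather than a straight line, which tends to preserve the magnitude structure of each parent model's weights. A minimal PyTorch sketch of the idea (a simplification, not the exact recipe used for this merge; in practice such merges are usually done with tooling like mergekit, and the function names here are illustrative):

```python
import torch

def slerp(t: float, w0: torch.Tensor, w1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors."""
    v0, v1 = w0.flatten().float(), w1.flatten().float()
    v0n = v0 / (v0.norm() + eps)
    v1n = v1 / (v1.norm() + eps)
    dot = torch.clamp(torch.dot(v0n, v1n), -1.0, 1.0)
    theta = torch.acos(dot)
    if theta.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return ((1 - t) * w0.float() + t * w1.float()).to(w0.dtype)
    s0 = torch.sin((1 - t) * theta) / torch.sin(theta)
    s1 = torch.sin(t * theta) / torch.sin(theta)
    return (s0 * v0 + s1 * v1).reshape(w0.shape).to(w0.dtype)

def merge_state_dicts(sd0: dict, sd1: dict, t: float = 0.5) -> dict:
    """Merge two architecturally identical state dicts parameter-by-parameter."""
    return {name: slerp(t, sd0[name], sd1[name]) for name in sd0}
```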
## Ru Longformer Base 4096

A base-sized Longformer model for Russian that supports a context length of up to 4096 tokens. It is initialized from the weights of blinoff/roberta-base-russian-v0 and fine-tuned on a dataset of Russian books.

Tags: Large Language Model, Transformers
Author: kazzand
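A minimal usage sketch with the Transformers library. The repository id is inferred from the listing above and may differ, and the example assumes the tokenizer inherits the RoBERTa-style `<mask>` token from its base model:

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Repository id inferred from the listing (author "kazzand"); verify before use.
model_id = "kazzand/ru-longformer-base-4096"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Masked-token prediction on Russian text ("The capital of Russia is called <mask>.").
text = "Столица России называется <mask>."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

mask_positions = inputs["input_ids"][0] == tokenizer.mask_token_id
print(tokenizer.decode(logits[0, mask_positions].argmax(dim=-1)))
```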
## Wangchanberta Base Att Spm Uncased Tagging

A model fine-tuned from airesearch/wangchanberta-base-att-spm-uncased (a Thai RoBERTa-style model); its specific downstream task is not stated, though the name suggests a token-tagging (sequence-labeling) task.

Tags: Large Language Model, Transformers
Author: bookpanda
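If the "Tagging" suffix does indicate a token-classification head, the model could be exercised as below. Both the repository id and the task are assumptions not confirmed by the listing:

```python
from transformers import pipeline

# Repository id and task are inferred from the model name; neither is documented.
tagger = pipeline(
    "token-classification",
    model="bookpanda/wangchanberta-base-att-spm-uncased-tagging",
)

# WangchanBERTa is a Thai model; the input means "I like to eat fried rice."
print(tagger("ผมชอบกินข้าวผัด"))
```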